
    SPATA: Spatio-tangible tools for fabrication-aware design

    The physical tools used when designing new objects for digital fabrication are mature, yet disconnected from their virtual counterparts. SPATA is a digital adaptation of two spatial measurement tools that explores their closer integration into virtual design environments. We adapt two traditional measurement tools, calipers and protractors, both of which can measure, transfer, and present size and angle. Their close integration into different design environments makes tasks more fluid and convenient. We describe the tools' design, a prototype implementation, their integration into different environments, and application scenarios validating the concept.
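
    As a rough illustration of the measure-and-transfer workflow the abstract describes, the Python sketch below reads a length from an instrumented caliper and writes it into a named parameter of a virtual model. The serial protocol, port name and model dictionary are assumptions made for illustration, not SPATA's actual implementation.

        # Illustrative sketch only: the serial protocol, port name and the model
        # dictionary below are assumptions, not SPATA's actual implementation.
        import serial  # pyserial, assumed transport to the instrumented caliper

        def read_caliper_mm(port="/dev/ttyUSB0"):
            """Read one length measurement (millimetres) from a hypothetical digital caliper."""
            with serial.Serial(port, 9600, timeout=1) as conn:
                line = conn.readline().decode("ascii").strip()  # e.g. "12.74"
                return float(line)

        def transfer_to_model(model, parameter_name):
            """Push the physical measurement into a virtual design parameter."""
            value_mm = read_caliper_mm()
            model[parameter_name] = value_mm  # e.g. size a box in the CAD scene to match
            return value_mm

        # Example: measure a real part, then size its virtual counterpart accordingly.
        model = {"enclosure_width_mm": 0.0}
        transfer_to_model(model, "enclosure_width_mm")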

    Portallax: bringing 3D display capabilities to handhelds

    We present Portallax, a clip-on technology to retrofit mobile devices with 3D display capabilities. Available technologies (e.g. Nintendo 3DS or LG Optimus) and clip-on solutions (e.g. 3DeeSlide and Grilli3D) force users to keep their head and the device in fixed positions. This is at odds with the nature of mobile use and limits interaction techniques such as tilting the device to control a game. Portallax uses an actuated parallax barrier and face tracking to realign the barrier's position to the user's position. This allows us to provide stereo, motion parallax and perspective correction cues within a 60-degree zone in front of the device. Our optimized barrier design minimizes colour distortion, maximizes resolution and produces larger view-zones, which accommodate ~81% of adults' interpupillary distances and allow eye tracking to be implemented with the front camera. We present a reference implementation, evaluate its key features and provide example applications illustrating the potential of Portallax.
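
    The key mechanism here is realigning the parallax barrier to the tracked head position. The Python sketch below shows the underlying similar-triangles geometry under stated assumptions; the barrier-to-display gap and the tracker's coordinate convention are hypothetical values, not figures from the paper.

        # Minimal geometric sketch of barrier realignment; the gap value and the
        # tracker's coordinate convention are assumptions, not the paper's numbers.
        GAP_MM = 1.2  # assumed distance between parallax barrier and display panel

        def barrier_offset_mm(eye_x_mm, eye_z_mm):
            """Lateral barrier shift that keeps the view zones on the tracked eyes.

            A barrier slit must lie on the line from the eye to the pixels it exposes,
            so by similar triangles the required shift scales with GAP_MM divided by
            the viewing distance.
            """
            return eye_x_mm * GAP_MM / eye_z_mm

        # Example: user 350 mm from the screen, head 40 mm right of centre.
        shift = barrier_offset_mm(40.0, 350.0)  # ~0.14 mm of actuator travel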

    FingerSlide: Investigating Passive Haptic Sliding As A Tacton Channel

    The haptic sensation of a surface sliding under a probing finger can be used to convey surface information or coded data to the user. In this paper, we investigate users' ability to discern different sliding profiles based on the velocity and direction of sliding, for use as haptic tactons. We built FingerSlide, a novel haptic device that can position and move surfaces under a user's finger, and used it to run two independent studies. The first study investigates whether users can identify the direction of sliding at different velocities. The second study investigates whether users can distinguish between two velocities. Our results show faster responses for higher velocities in the direction study and high error rates when identifying velocity differences in the second study. We discuss these results and infer design considerations for haptic devices that use the sliding effect to convey information.
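
    To illustrate how sliding profiles could serve as a tacton channel, the sketch below maps notification events to (direction, velocity) pairs, in line with the finding that direction is reliable while only about two velocity levels are distinguishable. The event names, velocity values and slide_surface interface are assumptions, not the authors' design.

        # Illustrative encoding only: the event names, velocity values and the
        # slide_surface.move() call are hypothetical.
        TACTONS = {
            "message":  ("left",  "slow"),
            "email":    ("left",  "fast"),
            "calendar": ("right", "slow"),
            "alarm":    ("right", "fast"),
        }
        VELOCITY_MM_S = {"slow": 20.0, "fast": 80.0}  # assumed actuator speeds

        def render_tacton(event, slide_surface):
            """Play the sliding pattern associated with a notification event."""
            direction, level = TACTONS[event]
            slide_surface.move(direction=direction, velocity=VELOCITY_MM_S[level])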

    Exploring interactions with physically dynamic bar charts

    Visualizations such as bar charts help users reason about data, but are mostly screen-based, rarely physical, and almost never physical and dynamic. This paper investigates the role of physically dynamic bar charts and evaluates new interactions for exploring and working with datasets rendered in dynamic physical form. To facilitate our exploration we constructed a 10x10 interactive bar chart and designed interactions that support fundamental visualisation tasks, specifically annotation, filtering, organization, and navigation. The interactions were evaluated in a user study with 17 participants. Our findings identify the preferred methods of working with the data for each task (e.g. directly tapping rows to hide bars), highlight the strengths and limitations of working with physical data, and discuss the challenges of integrating the proposed interactions into a larger data exploration system. In general, physical interactions were intuitive, informative, and enjoyable, paving the way for new explorations in physical data visualization.
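
    As a minimal sketch of one of the evaluated interactions (tapping a row to hide its bars), the Python below models the 10x10 chart as a grid of actuated heights. The BarChart class and the set_height() actuator call are hypothetical stand-ins, not the authors' code.

        # Sketch of the row-filtering interaction on a 10x10 actuated bar chart.
        # set_height() stands in for whatever firmware drives the physical rods.
        def set_height(row, col, height):
            """Stub for the actuator command raising rod (row, col) to `height` in [0, 1]."""
            pass

        class BarChart:
            def __init__(self, data):  # data: 10x10 nested list of values in [0, 1]
                self.data = data
                self.hidden_rows = set()

            def toggle_row(self, row):
                """Tap gesture on a row: hide it (lower its bars flat) or restore it."""
                self.hidden_rows ^= {row}
                self.refresh()

            def refresh(self):
                for r, row_values in enumerate(self.data):
                    for c, value in enumerate(row_values):
                        set_height(r, c, 0.0 if r in self.hidden_rows else value)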

    Design and Analysis of Haptic-Audio Based System for the Visually Impaired to Shop Online

    Many visually impaired customers are keen to shop online; however, they often encounter accessibility barriers, such as accessing and interpreting complexly designed websites and making online payments that require them to enter card details into a payment form. To study whether the visually impaired can shop online without assistance, we developed an online store with a product catalogue, a shopping cart and a payment system. The system uses the Falcon haptic device and voice recognition for navigation, interaction, and accessing and haptically evaluating products. Our qualitative analysis suggests that a framed three-section product catalogue with directed dialogue, directional cues and audio information, together with a haptic-audio enabled browser, allows the visually impaired to browse, select and haptically evaluate products; that an XHTML and VoiceXML based shopping cart enables them to interact with and verify its contents; and that a voice-password based payment system can automate form data entry and help the visually impaired make online payments independently.
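
    To make the voice-password payment idea concrete, the sketch below gates automatic form filling on a spoken passphrase. It is purely illustrative and written in Python rather than the paper's XHTML/VoiceXML stack; the speak() helper, the stored profile and the passphrase check are all hypothetical.

        def speak(text):
            """Stub for the system's text-to-speech output (hypothetical helper)."""
            print(text)

        STORED_PROFILE = {"card_number": "4111111111111111", "expiry": "12/26", "name": "A. Customer"}  # dummy data

        def pay(spoken_passphrase, expected_passphrase, payment_form):
            """Fill the payment form automatically once the spoken password matches."""
            if spoken_passphrase != expected_passphrase:
                speak("Password not recognised, please try again.")
                return False
            for field, value in STORED_PROFILE.items():
                payment_form[field] = value  # the user never types card details by hand
            speak("Payment details entered. Say 'confirm' to submit the order.")
            return True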

    Kick: investigating the use of kick gestures for mobile interactions

    In this paper we describe the use of kick gestures for interaction with mobile devices. Kicking is a well-studied leg action that can be harnessed in mobile contexts where the hands are busy or too dirty to interact with the phone. We examine the design space of kicking as an interaction technique through two user studies. The first study investigated how well users were able to control the direction of their kicks; users aimed their kicks best when the movement range was divided into segments of at least 24°. In the second study we looked at the velocity of a kick and found that users are able to kick with at least two distinguishable velocities, although they often undershoot the target velocity. Finally, we propose specific applications in which kicks can prove beneficial.
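
    A minimal sketch of how a kick might be decoded under the reported findings (directional segments of at least 24° and two distinguishable velocities) follows. The segment count, range of motion, velocity threshold and sensor interface are assumptions, not the paper's implementation.

        # Decode a kick into a discrete command, reflecting the reported findings.
        SEGMENT_DEG = 24.0
        RANGE_DEG = 96.0              # assumed usable range of motion (four segments)
        FAST_THRESHOLD_M_S = 2.0      # hypothetical boundary between slow and fast kicks

        def decode_kick(angle_deg, speed_m_s):
            """Map a kick to (segment index, speed class)."""
            angle_deg = max(0.0, min(RANGE_DEG - 1e-6, angle_deg))
            segment = int(angle_deg // SEGMENT_DEG)
            speed_class = "fast" if speed_m_s >= FAST_THRESHOLD_M_S else "slow"
            return segment, speed_class

        # Example: a kick 30 degrees from the leftmost direction at 1.4 m/s.
        print(decode_kick(30.0, 1.4))  # -> (1, 'slow')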

    SensaBubble: a chrono-sensory mid-air display of sight and smell

    We present SensaBubble, a chrono-sensory mid-air display system that generates scented bubbles to deliver information to the user via a number of sensory modalities. The system reliably produces single bubbles of specific sizes along a directed path. Each bubble produced by SensaBubble is filled with fog containing a scent relevant to the notification. The chrono-sensory aspect of SensaBubble means that information is presented both temporally and multimodally. Temporal information is enabled through two forms of persistence: firstly, a visual display projected onto the bubble, which endures only until it bursts; secondly, a scent released when the bubble bursts, which slowly disperses and leaves a longer-lasting perceptible trace of the event. We report details of SensaBubble’s design and implementation, as well as results of technical and user evaluations. We then discuss and demonstrate how SensaBubble can be adapted for use in a wide range of application contexts, from an ambient peripheral display for persistent alerts to an engaging display for gaming or education.

    Step into My Mind Palace: Exploration of a Collaborative Paragogy Tool in VR

    Virtual Reality (VR) can mediate remote collaborative learning and can support pedagogical processes like paragogy. Within education, methods such as spaced repetition and memory palaces exist to support the cognitive process of remembering. We identify an opportunity to enhance learner-led collaborative paragogy by combining these methods with immersive VR experiences. We present CleVR, a VR-mediated, collaboration-based system that supports the memory palace and spaced repetition techniques. In an exploratory study, we aim to identify the applicability, viability and user perception of a system combining these two techniques in VR. CleVR is a novel implementation that provides a location-driven metaphor to populate and present multiple resources related to a topic for peer-led exploration. We discuss the design and provide a prototype implementation of CleVR. We conducted two studies: a targeted expert user review and a broader proof-of-concept survey. Both studies produced encouraging outcomes, with the system described as ‘engaging’, ‘useful’ and ‘fun’. Our findings provide insights into the potential of Virtual Reality Learning Environments (VRLE) geared towards collaborative learner-led activities.
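
    Since the abstract combines memory palaces with spaced repetition, the sketch below shows one plausible way to attach a spaced-repetition schedule to palace locations. The interval-doubling rule and room names are assumptions; the abstract does not specify CleVR's scheduling logic.

        from datetime import date, timedelta

        class PalaceItem:
            def __init__(self, topic, location):
                self.topic = topic        # e.g. "Krebs cycle"
                self.location = location  # e.g. "kitchen" in the shared palace
                self.interval_days = 1
                self.due = date.today()

            def review(self, recalled_correctly, today=None):
                """Reschedule: double the gap after a success, reset it after a failure."""
                today = today or date.today()
                self.interval_days = self.interval_days * 2 if recalled_correctly else 1
                self.due = today + timedelta(days=self.interval_days)

        def due_items(items, today=None):
            """Items whose revisit date has arrived, for the next collaborative session."""
            today = today or date.today()
            return [item for item in items if item.due <= today]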

    Collaborating around digital tabletops: children’s physical strategies from the UK, India and Finland

    We present a study of children collaborating around interactive tabletops in three countries: the United Kingdom, India and Finland. Our data highlight the distinctive physical strategies children used when performing collaborative tasks during this study. Children in the UK tended to prefer static positioning with minimal physical contact and simultaneous object movement. Children in India employed dynamic positioning with frequent physical contact and simultaneous object movement. Children in Finland used a mixture of dynamic and static positioning with minimal physical contact and object movement. Our findings indicate the importance of understanding collaboration strategies and behaviours when designing and deploying interactive tabletops in heterogeneous educational environments. We conclude with a discussion of how designers of tabletops for schools can provide opportunities for children in different countries to define and shape their own collaboration strategies for small-group learning, taking into account their different classroom practices.